MOGPTK: The multi-output Gaussian process toolkit

Authors

Abstract

We present MOGPTK, a Python package for multi-channel data modelling using Gaussian processes (GP). The aim of this toolkit is to make multi-output GP (MOGP) models accessible to researchers, scientists, and practitioners alike. MOGPTK uses a Python front-end and relies on the PyTorch suite, thus enabling GPU-accelerated training. The toolkit facilitates implementing the entire pipeline of GP modelling, including data loading, parameter initialization, model learning and interpretation, up to data imputation and extrapolation. It implements the main multi-output covariance kernels from the literature, as well as spectral-based parameter initialization strategies. The source code, tutorials and examples in the form of Jupyter notebooks, together with the API documentation, can be found in the GitHub repository: https://github.com/GAMES-UChile/mogptk.
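To make the pipeline concrete, a minimal sketch of the workflow the abstract describes is given below. The class and method names (Data, DataSet, MOSM, init_parameters, train, predict, plot_prediction) and their signatures follow the repository's tutorials as best recalled and should be treated as assumptions that may differ between MOGPTK versions; the linked documentation is authoritative.

import numpy as np
import mogptk

# Two synthetic channels on a common input grid (illustrative data only).
t = np.linspace(0.0, 10.0, 200)
data_a = mogptk.Data(t, np.sin(t), name='channel_a')
data_b = mogptk.Data(t, np.sin(t + 0.5) + 0.1 * t, name='channel_b')

# Remove a window of observations in one channel to exercise imputation
# (remove_range is assumed from the toolkit's data-handling API).
data_b.remove_range(start=6.0, end=8.0)

dataset = mogptk.DataSet(data_a, data_b)

# Multi-Output Spectral Mixture (MOSM) kernel with Q=2 spectral components.
model = mogptk.MOSM(dataset, Q=2)

# Spectral-based parameter initialization, then gradient-based training
# on the PyTorch backend (GPU-accelerated when available).
model.init_parameters(method='BNSE')
model.train(method='Adam', lr=0.1, iters=500)

# Posterior predictions fill the removed window and extrapolate beyond the data.
model.predict()
model.plot_prediction()

MOSM is used here only as an example; the toolkit also exposes other multi-output kernels from the literature, which can be swapped in at the model-construction step.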


Related references

Variational Dependent Multi-output Gaussian Process Dynamical Systems

This paper presents a dependent multi-output Gaussian process (GP) for modeling complex dynamical systems. The outputs are dependent in this model, which is largely different from previous GP dynamical systems. We adopt convolved multi-output GPs to model the outputs, which are provided with a flexible multi-output covariance function. We adapt the variational inference method with inducing poi...
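As background (not part of the truncated abstract above), the convolved multi-output construction typically defines each output by smoothing a shared latent process with an output-specific kernel, which is what induces dependence between outputs; a sketch of the standard form, in notation of my own choosing:

\[
f_d(x) = \int G_d(x - z)\, u(z)\, dz,
\qquad
\operatorname{cov}\big(f_d(x), f_{d'}(x')\big) = \iint G_d(x - z)\, G_{d'}(x' - z')\, k_u(z, z')\, dz\, dz',
\]

so the cross-covariance between outputs d and d' is determined by the two smoothing kernels and the covariance k_u of the shared latent process.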


Multi-output local Gaussian process regression: Applications to uncertainty quantification

We develop an efficient, Bayesian Uncertainty Quantification framework using a novel treed Gaussian process model. The tree is adaptively constructed using information conveyed by the observed data about the length scales of the underlying process. On each leaf of the tree, we utilize Bayesian Experimental Design techniques in order to learn a multi-output Gaussian process. The constructed surr...


Collaborative Multi-output Gaussian Processes

We introduce the collaborative multi-output Gaussian process (GP) model for learning dependent tasks with very large datasets. The model fosters task correlations by mixing sparse processes and sharing multiple sets of inducing points. This facilitates the application of variational inference and the derivation of an evidence lower bound that decomposes across inputs and outputs. We learn all t...
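For reference, and in my own notation rather than the paper's, the mixing construction described above can be sketched as each output combining a small set of shared sparse GPs with an output-specific process:

\[
f_j(x) = \sum_{i=1}^{Q} w_{ji}\, g_i(x) + v_j(x),
\]

where each shared latent function g_i and each task-specific function v_j is a sparse GP with its own inducing points; this additive, per-function structure is what allows the variational lower bound to decompose across inputs and outputs.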


Multi-task Gaussian Process Prediction

In this paper we investigate multi-task learning in the context of Gaussian Processes (GP). We propose a model that learns a shared covariance function on input-dependent features and a “free-form” covariance matrix over tasks. This allows for good flexibility when modelling inter-task dependencies while avoiding the need for large amounts of data for training. We show that under the assumption...
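The "free-form" task covariance mentioned above can be written as a covariance that factorizes into a learned task matrix and a shared input kernel:

\[
K\big((x, i), (x', j)\big) = K^{f}_{ij}\; k^{x}(x, x'),
\]

with K^{f} a positive semi-definite matrix over tasks and k^{x} the shared covariance function on inputs, so inter-task dependencies are captured by K^{f} without requiring large amounts of training data per task.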


Multi-output separable Gaussian process: Towards an efficient, fully Bayesian paradigm for uncertainty quantification

The engineering community and, in particular, the researchers in uncertainty quantification, have been making extensive use of surrogates, even though most times it is not explicitly stated. One example is the so-called stochastic collocation (SC) method (see [1] for a classic illustration) in which the response is mod...



Journal

Journal title: Neurocomputing

Year: 2021

ISSN: 0925-2312, 1872-8286

DOI: https://doi.org/10.1016/j.neucom.2020.09.085